Shrinkage to Smooth Non-convex Cone: Principal Component Analysis as Stein Estimation
Authors
Abstract
In Kuriki and Takemura (1997a) we established a general theory of James-Stein type shrinkage to convex sets with smooth boundary. In this paper we show that our results can be generalized to the case where shrinkage is toward smooth non-convex cones. A primary example of this shrinkage is descriptive principal component analysis, in which the small singular values of the data matrix are shrunk. Here principal component analysis is interpreted as the problem of estimating a matrix mean, and shrinking the small singular values is regarded as shrinking the data matrix toward the manifold of matrices of smaller rank.
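As a purely numerical illustration of the idea in the abstract (not the paper's actual estimator), the sketch below shrinks the trailing singular values of a data matrix toward zero, which moves the matrix toward the manifold of rank-r matrices. The target rank r and the shrinkage factor alpha are hypothetical illustration parameters.

```python
import numpy as np

def shrink_toward_low_rank(X, r, alpha=0.5):
    """Shrink the trailing singular values of X toward zero.

    Keeping the leading r singular values intact and scaling the rest by
    (1 - alpha) moves X toward the manifold of rank-r matrices, i.e. the
    non-convex cone discussed in the abstract.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = s.copy()
    s_shrunk[r:] *= (1.0 - alpha)      # shrink only the small singular values
    return (U * s_shrunk) @ Vt

# Usage: a noisy observation of a rank-2 signal matrix.
rng = np.random.default_rng(0)
signal = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 5))
X = signal + 0.1 * rng.standard_normal((20, 5))
X_shrunk = shrink_toward_low_rank(X, r=2)
```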
Similar articles
James-Stein type estimator by shrinkage to closed convex set with smooth boundary
We give James-Stein type estimators of a multivariate normal mean vector by shrinkage toward a closed convex set K with smooth or piecewise smooth boundary. The rate of shrinkage is determined by the curvature of the boundary of K at the projection point onto K. By considering a sequence of polytopes K_j converging to K, we show that a particular estimator we propose is the limit of a sequence of estimat...
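A rough sketch of the generic "shrink toward the projection" form that this kind of estimator takes is given below, under strong simplifying assumptions: K is taken to be the non-negative orthant (chosen only because its projection is trivial), and the constant c is a placeholder. In the paper above, the amount of shrinkage is tied to the curvature of the boundary of K at the projection point, which is not modeled here.

```python
import numpy as np

def project_nonneg_orthant(y):
    # Euclidean projection onto the non-negative orthant (illustrative choice of K).
    return np.maximum(y, 0.0)

def shrink_toward_convex_set(y, c, project=project_nonneg_orthant):
    p = project(y)
    d = y - p                                  # residual outside K
    dist2 = float(d @ d)
    if dist2 == 0.0:
        return y                               # y already lies in K
    factor = max(0.0, 1.0 - c / dist2)         # positive-part shrinkage factor
    return p + factor * d

y = np.array([1.0, -2.0, 0.5, -0.3])
print(shrink_toward_convex_set(y, c=1.0))
```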
Differenced-Based Double Shrinking in Partial Linear Models
The partial linear model is very flexible, since the relation between the covariates and the response can be either parametric or nonparametric. However, estimation of the regression coefficients is challenging, since one must also estimate the nonparametric component simultaneously. As a remedy, the differencing approach, which eliminates the nonparametric component so that the regression coefficients can be estimated, can ...
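A minimal sketch of the first-order differencing idea for a partial linear model y = X beta + g(t) + noise is given below: after sorting the observations by t, differencing adjacent observations approximately removes the smooth component g, and beta is then estimated by ordinary least squares on the differenced data. The variable names and the simulated model are illustrative only, not the double-shrinking estimator of the paper.

```python
import numpy as np

def difference_estimator(y, X, t):
    order = np.argsort(t)                     # sort so that g(t) varies slowly
    y_s, X_s = y[order], X[order]
    dy = np.diff(y_s)                         # y_i - y_{i-1}: g nearly cancels
    dX = np.diff(X_s, axis=0)
    beta_hat, *_ = np.linalg.lstsq(dX, dy, rcond=None)
    return beta_hat

# Usage on simulated data with beta = (2, -1) and a smooth trend g(t) = sin(3t).
rng = np.random.default_rng(1)
n = 500
t = rng.uniform(0, 1, n)
X = rng.standard_normal((n, 2))
y = X @ np.array([2.0, -1.0]) + np.sin(3 * t) + 0.1 * rng.standard_normal(n)
print(difference_estimator(y, X, t))
```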
Cone-Constrained Principal Component Analysis
Estimating a vector from noisy quadratic observations is a task that arises naturally in many contexts, from dimensionality reduction, to synchronization and phase retrieval problems. It is often the case that additional information is available about the unknown vector (for instance, sparsity, sign or magnitude of its entries). Many authors propose non-convex quadratic optimization problems th...
Nonparametric Stein-type shrinkage covariance matrix estimators in high-dimensional settings
Estimating a covariance matrix is an important task in applications where the number of variables is larger than the number of observations. In the literature, shrinkage approaches for estimating a high-dimensional covariance matrix are employed to circumvent the limitations of the sample covariance matrix. A new family of nonparametric Stein-type shrinkage covariance estimators is proposed who...
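To illustrate the general Stein-type idea for covariance estimation (not the nonparametric estimator proposed in the paper), the sketch below forms a convex combination of the sample covariance matrix and a simple scaled-identity target. The shrinkage intensity lam is left as an explicit argument; the paper's contribution is precisely in how that intensity is estimated, which is not reproduced here.

```python
import numpy as np

def shrinkage_covariance(X, lam):
    n, p = X.shape
    S = np.cov(X, rowvar=False)               # p x p sample covariance
    target = (np.trace(S) / p) * np.eye(p)    # scaled-identity shrinkage target
    return (1.0 - lam) * S + lam * target     # convex combination

rng = np.random.default_rng(2)
X = rng.standard_normal((30, 100))            # n < p: the sample covariance is singular
Sigma_hat = shrinkage_covariance(X, lam=0.3)  # shrunken estimate is well conditioned
```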
REACT Fits to Linear Models and Scatterplots
REACT estimators for the mean of a linear model involve three steps: transforming the model to a canonical form that provides an economical representation of the unknown mean vector, estimating the risks of a class of candidate linear shrinkage estimators, and adaptively selecting the candidate estimator that minimizes estimated risk. When the mean vector is smooth, the desired canonical form o...
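A rough sketch of the three steps described above is given below, under simplifying assumptions: the canonical transform is taken to be an orthonormal cosine basis, the candidate shrinkage estimators are the nested "keep the first k coefficients" rules, and the noise variance sigma2 is assumed known. The function names and the simulated example are illustrative, not the authors' implementation.

```python
import numpy as np

def cosine_basis(n):
    """Orthonormal DCT-II basis; columns are the basis vectors."""
    k = np.arange(n)
    U = np.sqrt(2.0 / n) * np.cos(np.pi * np.outer(k + 0.5, k) / n)
    U[:, 0] /= np.sqrt(2.0)
    return U

def react_nested_fit(y, sigma2):
    n = len(y)
    U = cosine_basis(n)
    z = U.T @ y                                  # canonical coefficients
    # Estimated risk of keeping the first k coefficients and zeroing the rest:
    # variance of the kept part plus (z_i^2 - sigma2) for each dropped coefficient.
    risks = [k * sigma2 + np.sum(z[k:] ** 2 - sigma2) for k in range(n + 1)]
    k_best = int(np.argmin(risks))               # candidate minimizing estimated risk
    f = np.zeros(n)
    f[:k_best] = 1.0                             # selected shrinkage vector
    return U @ (f * z)                           # fitted mean vector

# Usage: denoise a smooth mean vector observed with Gaussian noise.
rng = np.random.default_rng(3)
n = 200
t = np.linspace(0, 1, n)
mu = np.sin(2 * np.pi * t)
y = mu + 0.3 * rng.standard_normal(n)
fit = react_nested_fit(y, sigma2=0.3 ** 2)
```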